using LinearAlgebra, Printf, LaTeXStrings, Plots, Elliptic.Jacobi
import Base.isless
# Sort complex numbers by imaginary part so the three eigenvalues keep a
# consistent ordering as t varies. (Extending Base.isless for ComplexF64 is
# type piracy, but it is convenient in a notebook.)
function isless(a::ComplexF64, b::ComplexF64)
    return imag(a) < imag(b)
end
isless (generic function with 56 methods)
sort([1.0+1im, 1.0 + 2.0im])
2-element Vector{ComplexF64}:
1.0 + 1.0im
1.0 + 2.0im
Recall the test problem
$$v'''(t) + v'(t)v(t) - \frac{\beta_1 + \beta_2 + \beta_3}{3}v'(t) = 0,$$where $\beta_1 < \beta_2 < \beta_3$. It follows that
$$v(t) = \beta_2 + \left(\beta_3 - \beta_2\right)\text{cn}^2\left(\sqrt{\frac{\beta_3 - \beta_1}{12}}t, \sqrt{\frac{\beta_3 - \beta_2}{\beta_3 - \beta_1}}\right)$$is a solution where $\text{cn}(x, k)$ is the Jacobi cosine function and $k$ is the elliptic modulus. Some notations use $\text{cn}(x, m)$ where $m = k^2$. The corresponding initial conditions are
$$v(0) = \beta_3, \quad v'(0) = 0, \quad v''(0) = -\frac{\left(\beta_3 - \beta_1\right)\left(\beta_3 - \beta_2\right)}{6}.$$Let's write the equation as a system and compute the Jacobian. Based on an analysis of the Jacobian's eigenvalues, first for $\beta_1 = -1$, $\beta_2 = 1$, $\beta_3 = 2$ and then for $\beta_1 = 0$, $\beta_2 = 1$, $\beta_3 = 10$, we will see if we can suggest methods to solve the problem.
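As a quick sanity check (a sketch, assuming the Elliptic.jl convention that the second argument is $m = k^2$), the initial conditions can be verified by finite differences:

```julia
let β₁ = 0.0, β₂ = 1.0, β₃ = 10.0
    a = sqrt((β₃ - β₁)/12)
    m = (β₃ - β₂)/(β₃ - β₁)          # Elliptic.Jacobi takes m = k²
    v = t -> β₂ + (β₃ - β₂)*cn(a*t, m)^2
    h = 1e-4
    @show v(0.0)                                # should be β₃ = 10
    @show (v(h) - v(-h))/(2h)                   # should be ≈ 0
    @show (v(h) - 2*v(0.0) + v(-h))/h^2         # should be ≈ -(β₃-β₁)(β₃-β₂)/6 = -15
end
```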
Letting $u_1 = v(t)$, $u_2 = v'(t)$, and $u_3 = v''(t)$, we can express the nonlinear system as $u' = f(u)$ where
$$f(u) = \begin{bmatrix} u_2 \\ u_3 \\ \frac{\beta_1 + \beta_2 + \beta_3}{3}u_2 - u_1u_2 \end{bmatrix}.$$The Jacobian, $f'(u)$, is then given by
$$\begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ -u_2 & \frac{\beta_1 + \beta_2 + \beta_3}{3} - u_1 & 0 \end{bmatrix}.$$The range of possible eigenvalues for different values of $t$ are plotted below.
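Note that the Jacobian is a companion matrix with zero trace, so its eigenvalues are the roots of a depressed cubic and sum to zero, which explains the symmetric placement seen in the plots below:
$$\det\left(f'(u) - \lambda I\right) = -\left(\lambda^3 - \left(\frac{\beta_1 + \beta_2 + \beta_3}{3} - u_1\right)\lambda + u_2\right), \qquad \lambda_1 + \lambda_2 + \lambda_3 = 0.$$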
β₁ = -1.
β₂ = 1.
β₃ = 2.
c = (β₁ + β₂ + β₃)/3
# Elliptic.Jacobi takes the parameter m = k², so (β₃-β₂)/(β₃-β₁) is passed directly.
vcn = t -> cn(sqrt((β₃-β₁)/12)*t, (β₃-β₂)/(β₃-β₁))
vsn = t -> sn(sqrt((β₃-β₁)/12)*t, (β₃-β₂)/(β₃-β₁))
vdn = t -> dn(sqrt((β₃-β₁)/12)*t, (β₃-β₂)/(β₃-β₁))
function Jacobian(t)
    v = β₂ + (β₃ - β₂)*vcn(t)^2
    # chain rule: d/dt cn(a*t, m) = -a*sn(a*t, m)*dn(a*t, m) with a = sqrt((β₃-β₁)/12)
    dv = -2*(β₃ - β₂)*sqrt((β₃-β₁)/12)*vcn(t)*vsn(t)*vdn(t)
    return [0.0 1.0 0.0; 0.0 0.0 1.0; -dv c-v 0.0]
end
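Since the Jacobian has zero trace, the three eigenvalues must sum to zero at every $t$; a quick numerical check at one sample point:

```julia
# Zero trace ⟹ eigenvalues sum to zero (up to roundoff).
@show sum(eigvals(Jacobian(1.0)))
```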
t = 0:.001:40;
λs = map(s -> eigvals(Jacobian(s)), t)
λs = [ sort(i) for i in λs]
λ1 = [i[1] for i in λs][1:10:end]
λ2 = [i[2] for i in λs][1:10:end]
λ3 = [i[3] for i in λs][1:10:end];
p = scatter(λ1 |> real, λ1 |> imag, markercolor = :blue, markerstrokewidth=0, yaxis = [-3,3], xaxis = [-2,2], label = L"\lambda_1");
scatter!(λ2 |> real, λ2 |> imag, markercolor = :green, markerstrokewidth=0, yaxis = [-3,3], xaxis = [-2,2], label = L"\lambda_2");
scatter!(λ3 |> real, λ3 |> imag, markercolor = :red, markerstrokewidth=0, yaxis = [-3,3], xaxis = [-2,2], label = L"\lambda_3")
savefig(p,"jacobian_evals_1.pdf")
Due to the location of the eigenvalues, it is not possible (see HW3!) for any method's stability region to contain all of the eigenvalues at all times. So the goal is instead to maximize how often they lie inside. With something like the trapezoidal rule, whose stability region is exactly the left half-plane, the eigenvalues are in the stability region about half of the time. With backward Euler, this is true of $\lambda_2$, but the other eigenvalues are within the stability region more than half of the time.
Something like leapfrog, whose stability region is only an interval on the imaginary axis, would be a bad choice here: these eigenvalues have nonzero real part for almost all $t$.
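As a rough check of these claims (a sketch, assuming the grid `t` and sorted eigenvalue list `λs` computed above), we can count how often each eigenvalue lies in the left half-plane, which is exactly the trapezoidal rule's stability region:

```julia
# Fraction of time steps at which each (imaginary-part-sorted) eigenvalue has
# nonpositive real part, i.e. lies in the trapezoidal rule's stability region.
fracs = [count(λ -> real(λ[i]) <= 0, λs)/length(λs) for i in 1:3]
@show fracs
```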
β₁ = 0.
β₂ = 1.
β₃ = 10.
c = (β₁ + β₂ + β₃)/3
vcn = t -> cn(sqrt((β₃-β₁)/12)*t, (β₃-β₂)/(β₃-β₁) )
vsn = t -> sn(sqrt((β₃-β₁)/12)*t, (β₃-β₂)/(β₃-β₁) )
vdn = t -> dn(sqrt((β₃-β₁)/12)*t, (β₃-β₂)/(β₃-β₁) )
function Jacobian(t)
    v = β₂ + (β₃ - β₂)*vcn(t)^2
    # chain rule: d/dt cn(a*t, m) = -a*sn(a*t, m)*dn(a*t, m) with a = sqrt((β₃-β₁)/12)
    dv = -2*(β₃ - β₂)*sqrt((β₃-β₁)/12)*vcn(t)*vsn(t)*vdn(t)
    return [0.0 1.0 0.0; 0.0 0.0 1.0; -dv c-v 0.0]
end
t = 0:.001:40;
λs = map(s -> eigvals(Jacobian(s)), t)
λs = [ sort(i) for i in λs]
λ1 = [i[1] for i in λs][1:10:end]
λ2 = [i[2] for i in λs][1:10:end]
λ3 = [i[3] for i in λs][1:10:end];
p = scatter(λ1 |> real, λ1 |> imag, markercolor = :blue, markerstrokewidth=0, yaxis = [-3,3], xaxis = [-2,2], label = L"\lambda_1");
scatter!(λ2 |> real, λ2 |> imag, markercolor = :green, markerstrokewidth=0, yaxis = [-3,3], xaxis = [-2,2], label = L"\lambda_2");
scatter!(λ3 |> real, λ3 |> imag, markercolor = :red, markerstrokewidth=0, yaxis = [-3,3], xaxis = [-2,2], label = L"\lambda_3")
savefig(p,"jacobian_evals_2.pdf")
The method one should choose for this problem is maybe less clear, but considerations similar to those for the previous choice of $\beta_i$'s still apply. For both problems, fourth-order Runge--Kutta is probably the best choice: it is one-step and accurate, and for sufficiently small step size the eigenvalues will be within its stability region approximately half the time.
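As a sketch (a minimal illustration, not a tuned solver), classical fourth-order Runge--Kutta applied to this system with the exact initial conditions and the $\beta_i$'s above might look like:

```julia
# Classical RK4 for u' = f(u); a minimal sketch, not a production integrator.
function rk4(f, u0, h, N)
    u = copy(u0)
    for _ in 1:N
        k1 = f(u)
        k2 = f(u + h/2*k1)
        k3 = f(u + h/2*k2)
        k4 = f(u + h*k3)
        u += h/6*(k1 + 2k2 + 2k3 + k4)
    end
    return u
end

f  = u -> [u[2], u[3], c*u[2] - u[1]*u[2]]      # the system from above
u0 = [β₃, 0.0, -(β₃ - β₁)*(β₃ - β₂)/6]          # exact initial conditions
u  = rk4(f, u0, 0.001, 40_000)                  # integrate to t = 40
# compare u[1] against the exact solution β₂ + (β₃ - β₂)*vcn(40.0)^2
```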